Boosting Iris Recognition by Margin-Based Loss Functions

Authors

Abstract

In recent years, the topic of contactless biometric identification has gained considerable traction due to the COVID-19 pandemic. One of the most well-known such technologies is iris recognition. Determining a classification threshold for large datasets of iris images remains challenging. To solve this issue, it is essential to extract more discriminatory features from the images. Choosing an appropriate loss function to enhance discrimination power is one of the significant factors in deep learning networks. This paper proposes a novel framework that integrates the light-weight MobileNet architecture with customized ArcFace and Triplet loss functions. By combining the two loss functions, it is possible to improve compactness within a class and discrepancies between classes. To reduce the amount of preprocessing, the normalization step is omitted and the segmented iris images are used directly. In contrast to the original SoftMax loss, the EER of the combined loss decreased from 1.11% to 0.45% and the TPR increased from 99.77% to 100%; on CASIA-Iris-Thousand, the EER decreased from 4.8% to 1.87%, while the TPR improved from 97.42% to 99.66%. Experiments have demonstrated that the proposed approach with the combined loss functions can significantly outperform the state of the art and achieve outstanding results.
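To make the combined objective concrete, here is a minimal PyTorch sketch of an ArcFace head paired with a batch-hard triplet term. The scale s = 30, the angular and triplet margins, the 128-dimensional embedding, and the equal weighting of the two terms are illustrative assumptions, not the paper's reported settings; in the paper's pipeline the embeddings would come from a MobileNet backbone.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ArcFaceHead(nn.Module):
    """ArcFace: add an angular margin m to the target-class angle,
    then apply a scaled softmax cross-entropy (widens inter-class gaps)."""
    def __init__(self, embed_dim, num_classes, s=30.0, m=0.50):
        super().__init__()
        self.W = nn.Parameter(torch.empty(num_classes, embed_dim))
        nn.init.xavier_uniform_(self.W)
        self.s, self.m = s, m

    def forward(self, emb, labels):
        # cosine similarity between L2-normalized embeddings and class centers
        cos = F.linear(F.normalize(emb), F.normalize(self.W)).clamp(-1 + 1e-7, 1 - 1e-7)
        theta = torch.acos(cos)
        onehot = F.one_hot(labels, self.W.shape[0]).float()
        logits = self.s * torch.cos(theta + self.m * onehot)  # margin on true class only
        return F.cross_entropy(logits, labels)

def batch_hard_triplet_loss(emb, labels, margin=0.3):
    """Triplet term: pull the hardest positive closer than the hardest
    negative by at least `margin` (tightens intra-class clusters)."""
    dist = torch.cdist(emb, emb)
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    pos_mask = same & ~torch.eye(len(labels), dtype=torch.bool)
    hardest_pos = (dist * pos_mask).max(dim=1).values
    hardest_neg = dist.masked_fill(same, float("inf")).min(dim=1).values
    return F.relu(hardest_pos - hardest_neg + margin).mean()

# Stand-in embeddings; in the paper these would be MobileNet outputs.
emb = F.normalize(torch.randn(16, 128), dim=1)
labels = torch.randint(0, 10, (16,))
head = ArcFaceHead(embed_dim=128, num_classes=10)
loss = head(emb, labels) + batch_hard_triplet_loss(emb, labels)  # equal weights assumed
print(loss.item())
```

The division of labor is the point of the combination: the ArcFace term enlarges angular gaps between classes, while the triplet term explicitly compresses each class around its own samples.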


Similar resources

Margin Maximizing Loss Functions

Margin maximizing properties play an important role in the analysis of classification models, such as boosting and support vector machines. Margin maximization is theoretically interesting because it facilitates generalization error analysis, and practically interesting because it presents a clear geometric interpretation of the models being built. We formulate and prove a sufficient condition fo...
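As a quick illustration of the margin-based view this abstract builds on, the snippet below (a generic sketch, not code from the paper) writes three standard losses as functions of the margin z = y·f(x); each decreases as the margin grows, which is the property margin-maximization analyses exploit.

```python
import numpy as np

# Margin-based losses as functions of z = y * f(x); larger z means a
# more confidently correct prediction, and each loss decreases in z.
hinge       = lambda z: np.maximum(0.0, 1.0 - z)   # support vector machines
logistic    = lambda z: np.log1p(np.exp(-z))       # logistic regression
exponential = lambda z: np.exp(-z)                 # AdaBoost

for z in (-1.0, 0.0, 1.0, 2.0):
    print(z, hinge(z), round(logistic(z), 3), round(exponential(z), 3))
```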


Robust Loss Functions for Boosting

Boosting is known as a gradient descent algorithm over loss functions. It is often pointed out that the typical boosting algorithm, Adaboost, is highly affected by outliers. In this letter, loss functions for robust boosting are studied. Based on the concept of robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. Next, ...
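One simple way to see the outlier problem, and a robust fix, is to bound the loss. The cap below is only an illustrative stand-in for the transformation the letter actually proposes.

```python
import numpy as np

def exp_loss(z):
    """AdaBoost's exponential loss is unbounded: one example with a very
    negative margin can dominate the whole objective."""
    return np.exp(-z)

def capped_exp_loss(z, c=2.0):
    """Illustrative robustification: saturate the loss once the margin
    drops below -c, so extreme outliers stop driving the updates."""
    return np.minimum(np.exp(-z), np.exp(c))

margins = np.array([-10.0, -1.0, 0.0, 2.0])  # -10 plays the outlier
print(exp_loss(margins).round(2))        # [22026.47  2.72  1.    0.14]
print(capped_exp_loss(margins).round(2))  # [    7.39  2.72  1.    0.14]
```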


A Note on Margin-based Loss Functions in Classification, by Yi Lin

In many classification procedures, the classification function is obtained (or trained) by minimizing a certain empirical risk on the training sample. The classification is then based on the sign of the classification function. In recent years, there have been a host of classification methods proposed in machine learning that use different margin-based loss functions in the training. Examples i...
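The recipe this note describes, fit f by minimizing an empirical margin-based risk and then classify by the sign of f, fits in a few lines. The sketch below uses the logistic loss with a linear model and is a generic illustration, not the note's own code.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])             # labels in {-1, +1}

w = np.zeros(2)
for _ in range(500):
    z = y * (X @ w)                               # margins y * f(x)
    # gradient of the empirical logistic risk mean(log(1 + exp(-z)))
    grad = -(X * (y / (1.0 + np.exp(z)))[:, None]).mean(axis=0)
    w -= 0.5 * grad

pred = np.sign(X @ w)                             # classify by the sign of f
print("training accuracy:", (pred == y).mean())
```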


Scaling Boosting by Margin-Based Inclusion of Features and Relations

Boosting is well known to increase the accuracy of propositional and multi-relational classification learners. However, the base learner’s efficiency vitally determines boosting’s efficiency since the complexity of the underlying learner is amplified by iterated calls of the learner in the boosting framework. The idea of restricting the learner to smaller feature subsets in order to increase ef...
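Structurally, the idea is an AdaBoost-style loop in which each base learner only ever sees a small feature subset; in the sketch below the random choice of subset is just a placeholder for the paper's margin-based inclusion criterion.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost_on_feature_subsets(X, y, rounds=20, subset_size=3, seed=0):
    """AdaBoost with depth-1 trees, each restricted to a small feature
    subset (cheaper base learners, as the paper's restriction intends)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n, 1.0 / n)                        # example weights
    ensemble = []
    for _ in range(rounds):
        feats = rng.choice(d, size=subset_size, replace=False)
        stump = DecisionTreeClassifier(max_depth=1).fit(X[:, feats], y, sample_weight=w)
        pred = stump.predict(X[:, feats])
        err = np.clip(w @ (pred != y), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)      # standard AdaBoost weight
        w *= np.exp(-alpha * y * pred)             # upweight mistakes
        w /= w.sum()
        ensemble.append((feats, stump, alpha))
    return ensemble

X = np.random.default_rng(1).normal(size=(300, 10))
y = np.where(X[:, 0] + X[:, 3] > 0, 1.0, -1.0)
ens = boost_on_feature_subsets(X, y)
score = sum(a * m.predict(X[:, f]) for f, m, a in ens)
print("training accuracy:", (np.sign(score) == y).mean())
```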


Smooth Boosting for Margin-Based Ranking

We propose a new boosting algorithm for bipartite ranking problems. Our boosting algorithm, called SoftRankBoost, is a modification of RankBoost which maintains only smooth distributions over data. SoftRankBoost provably achieves approximately the maximum soft margin over all pairs of positive and negative examples, which implies high AUC score for future data.
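The link this abstract draws between pairwise margins and AUC is easy to state in code: AUC is exactly the fraction of positive-negative pairs the scorer orders correctly, so enlarging pairwise margins pushes AUC up. The snippet below is a generic illustration of that identity, not SoftRankBoost itself.

```python
import numpy as np

def pairwise_auc(pos_scores, neg_scores):
    """AUC = fraction of positive-negative pairs ranked correctly
    (ties count half); maximizing pairwise margins maximizes this."""
    diff = pos_scores[:, None] - neg_scores[None, :]   # all pairwise margins
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

pos = np.array([0.9, 0.7, 0.4])
neg = np.array([0.5, 0.2])
print(pairwise_auc(pos, neg))   # 5 of 6 pairs correct -> 0.833...
```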



Journal

Journal title: Algorithms

Year: 2022

ISSN: 1999-4893

DOI: https://doi.org/10.3390/a15040118